Nathan unify modelargs #609
Conversation
    padding_side="left",
    truncation_side="left",
)
except FileNotFoundError:
This should not be needed, as it is an issue with the tokenizer version or config.
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
- Removed the batch-size override and made batch size part of the model config.
- Removed the env config; config should be part of the env directly, with no need to load it separately.
- Better loading of models.
- Added a base class for model configs, which allows parsing a model config through the CLI or a config file.
- Unified naming for model args, i.e. `model_name`.
- Removed the OpenAI endpoint; we can just use litellm for this. The same goes for TGI and inference endpoints: we don't really need them, and it's better to have one interface.
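The "base class for model configs" idea above can be sketched as follows. This is a hedged illustration using only the standard library: the class name `ModelConfig`, the `batch_size` field, and the `from_cli`/`from_file` helpers are assumptions made for the example and are not lighteval's actual definitions; only the unified `model_name` field is named by the PR itself.

```python
import argparse
import json
from dataclasses import MISSING, dataclass, fields


@dataclass
class ModelConfig:
    """Hypothetical base class for backend model configs.

    Every backend config shares a unified `model_name`, and batch size
    lives on the config itself rather than as a separate override flag.
    """

    model_name: str
    batch_size: int = 1  # assumed field, for illustration only

    @classmethod
    def from_cli(cls, argv):
        # Build an argparse parser from the dataclass fields, so the same
        # config class can be filled in from command-line arguments.
        parser = argparse.ArgumentParser()
        for f in fields(cls):
            if f.default is MISSING:
                parser.add_argument(f"--{f.name}", type=f.type, required=True)
            else:
                parser.add_argument(f"--{f.name}", type=f.type, default=f.default)
        return cls(**vars(parser.parse_args(argv)))

    @classmethod
    def from_file(cls, path):
        # Alternatively, fill the same fields from a JSON config file.
        with open(path) as fh:
            return cls(**json.load(fh))


# Usage: the same config can come from the CLI or from a file.
cfg = ModelConfig.from_cli(["--model_name", "gpt2", "--batch_size", "8"])
```

Subclasses for specific backends would add their own fields on top of the shared base, so every backend is parsed through one mechanism instead of ad-hoc per-endpoint argument handling.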
See: #857